Nesterov's Acceleration For Approximate Newton

Authors

  • Haishan Ye
  • Zhihua Zhang
Abstract

Optimization plays a key role in machine learning. Recently, stochastic second-order methods have attracted much attention because of their low per-iteration computational cost. However, these algorithms may perform poorly, especially when the Hessian is hard to approximate both well and efficiently. As far as we know, there is no effective way to handle this problem. In this paper, we resort to Nesterov's acceleration technique to improve the convergence performance of a class of second-order methods called approximate Newton. We give a theoretical analysis showing that Nesterov's acceleration technique improves the convergence of approximate Newton just as it does for first-order methods. Accordingly, we propose an accelerated regularized sub-sampled Newton method. In experiments, our accelerated algorithm performs much better than the original regularized sub-sampled Newton, which validates our theory empirically. Moreover, the accelerated regularized sub-sampled Newton performs comparably to, and sometimes better than, classical algorithms.
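As a rough illustration of the kind of method the abstract describes, the sketch below combines a regularized sub-sampled Newton step with a Nesterov-style extrapolation on a ridge-regression objective. The momentum parameter theta, the regularizer alpha, the sample size, and the function names are illustrative assumptions, not the authors' exact algorithm or tuned settings.

```python
# A minimal sketch of an accelerated regularized sub-sampled Newton iteration
# for ridge regression, f(x) = ||Ax - b||^2 / (2n) + (lam/2) ||x||^2.
# The momentum rule and the regularizer alpha are illustrative assumptions.
import numpy as np

def accelerated_subsampled_newton(A, b, lam=1e-3, alpha=1e-2, theta=0.9,
                                  sample_size=128, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    y = x.copy()
    for _ in range(iters):
        # Full gradient of the ridge objective at the extrapolated point y.
        grad = A.T @ (A @ y - b) / n + lam * y
        # Sub-sampled Hessian: average over a random row sample, plus a
        # regularizer alpha*I to keep the approximation well conditioned.
        idx = rng.choice(n, size=min(sample_size, n), replace=False)
        H = A[idx].T @ A[idx] / len(idx) + (lam + alpha) * np.eye(d)
        x_new = y - np.linalg.solve(H, grad)   # approximate Newton step
        y = x_new + theta * (x_new - x)        # Nesterov-style extrapolation
        x = x_new
    return x

# Usage on a small synthetic problem.
rng = np.random.default_rng(1)
A = rng.standard_normal((1000, 20))
b = A @ rng.standard_normal(20) + 0.1 * rng.standard_normal(1000)
x_hat = accelerated_subsampled_newton(A, b)
```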


Similar Papers

Nesterov's Acceleration For Second Order Method

Optimization plays a key role in machine learning. Recently, stochastic second-order methods have attracted much attention due to their low computational cost in each iteration. However, these algorithms may perform poorly, especially when it is hard to approximate the Hessian well and efficiently. As far as we know, there is no effective way to handle this problem. In this paper, we resort to N...

A Variational Perspective on Accelerated Methods in Optimization

Accelerated gradient methods play a central role in optimization, achieving optimal rates in many settings. Although many generalizations and extensions of Nesterov's original acceleration method have been proposed, it is not yet clear what is the natural scope of the acceleration concept. In this paper, we study accelerated methods from a continuous-time perspective. We show that there is a La...
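The truncated sentence refers to a Lagrangian. In the continuous-time framework of that paper (by Wibisono, Wilson, and Jordan), acceleration is generated by the Bregman Lagrangian; the form below states it with their time-dependent scaling functions, as a pointer rather than a substitute for the source:

```latex
% Bregman Lagrangian; \alpha_t, \beta_t, \gamma_t are time-dependent scaling
% functions and D_h is the Bregman divergence of a convex function h.
\mathcal{L}(X, V, t)
  = e^{\alpha_t + \gamma_t}
    \left( D_h\!\left(X + e^{-\alpha_t} V,\, X\right) - e^{\beta_t} f(X) \right),
\qquad
D_h(y, x) = h(y) - h(x) - \langle \nabla h(x),\, y - x \rangle .
```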

On a Convex Acceleration of Newton's Method

In this study, we use a convex acceleration of Newton's method (or super-Halley method) to approximate solutions of nonlinear equations. We provide sufficient convergence conditions for this method in three space settings: real line, complex plane, and Banach space. Several applications of our results are also provided.
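For the one-dimensional case, the super-Halley iteration admits a compact closed form; the sketch below is a minimal illustration, with the stopping tolerance and the test equation x^3 - 2 = 0 chosen for the example rather than taken from the paper.

```python
# A minimal sketch of the scalar super-Halley iteration (a convex acceleration
# of Newton's method). The caller supplies f and its first two derivatives;
# the tolerance, iteration cap, and test problem are illustrative assumptions.
def super_halley(f, df, d2f, x0, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        fx, dfx = f(x), df(x)
        if abs(fx) < tol:
            break
        L = fx * d2f(x) / dfx ** 2              # curvature ratio L_f(x)
        x -= (1.0 + 0.5 * L / (1.0 - L)) * fx / dfx
    return x

# Example: approximate the real cube root of 2.
root = super_halley(lambda x: x**3 - 2,
                    lambda x: 3 * x**2,
                    lambda x: 6 * x,
                    x0=1.5)
```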

Thermo-mechanical nonlinear vibration analysis of fluid-conveying structures subjected to different boundary conditions using Galerkin-Newton-Harmonic balancing method

The development of mathematical models for describing the dynamic behaviours of fluid conveying pipes, micro-pipes and nanotubes under the influence of some thermo-mechanical parameters results in nonlinear equations that are very difficult to solve analytically. In cases where the exact analytical solutions are presented either in implicit or explicit forms, high skills and rigorous mathemat...

Inexact Jacobian Constraint Preconditioners in Optimization

In this paper we analyze a class of approximate constraint preconditioners in the acceleration of Krylov subspace methods for the solution of reduced Newton systems arising in optimization with interior point methods. We propose a dynamic sparsification of the Jacobian matrix at every stage of the interior point method. Spectral analysis of the preconditioned matrix is performed and bounds on i...
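For context, the reduced Newton systems mentioned here are saddle-point (KKT) systems, and a constraint preconditioner keeps the constraint blocks while replacing the Hessian block with a cheap approximation. The generic form below is the standard construction; the tilde on the Jacobian indicates the kind of sparsified (inexact) approximation the abstract describes, and the choice G = diag(H) is only one common example, not necessarily the paper's.

```latex
% Saddle-point Newton system K and a constraint preconditioner P, with a cheap
% Hessian approximation G (e.g., diag(H)) and a sparsified Jacobian \tilde{A}.
K = \begin{bmatrix} H & A^{\top} \\ A & 0 \end{bmatrix},
\qquad
P = \begin{bmatrix} G & \tilde{A}^{\top} \\ \tilde{A} & 0 \end{bmatrix}.
```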


Journal:
  • CoRR

Volume: abs/1710.08496   Issue: -

Pages: -

Publication date: 2017